Convergence in Distribution of Self-Normalized Sup-Norms of Kernel Density Estimators
Authors
Abstract
Let $f_n$ denote a kernel density estimator of a density $f$ on the real line, for a bounded, compactly supported probability kernel $K$. Under relatively weak smoothness conditions on $f$ and $K$ it is proved, for every $0 < \beta < 1/2$, that the sequence
$$\hat A_n\left(\frac{\sqrt{n h_n}}{\|K\|_2}\,\sup_{t\in\hat D_{a_n}}\frac{|f_n(t)-f(t)|}{\sqrt{f_n(t)}}-\hat A_n\right)$$
converges in distribution to the double exponential law. Here $\hat A_n$ is constructed from the sample, $a_n \to \infty$ as a power of $n$, and $\hat D_{a_n} = \{t : f_n(t) \ge a_n^{-1}\}$. Thus, this result provides distribution-free asymptotic confidence bands for densities on the real line.
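The band construction suggested by the abstract can be sketched numerically. The Python sketch below is illustrative only: the Epanechnikov kernel, the bandwidth $h_n = n^{-1/3}$, the threshold sequence $a_n = n^{0.1}$ ("a power of $n$"), and the placeholder value for $\hat A_n$ are assumptions made here for demonstration; the paper's actual sample-based construction of $\hat A_n$ is not reproduced.

```python
import numpy as np

def kde(x, data, h):
    """Kernel density estimate f_n at the points x, using the Epanechnikov
    kernel (a bounded, compactly supported probability kernel)."""
    u = (x[:, None] - data[None, :]) / h
    k = 0.75 * (1.0 - u**2) * (np.abs(u) <= 1.0)   # K(u) = 3/4 (1 - u^2) on [-1, 1]
    return k.mean(axis=1) / h

def self_normalized_band(x, data, h, a_n, A_hat, q):
    """Approximate confidence band obtained by inverting the displayed limit:
    A_hat * ( sqrt(n h)/||K||_2 * sup_{D_hat} |f_n - f|/sqrt(f_n) - A_hat ) <= q,
    where q is a quantile of the double exponential (Gumbel) law.
    A_hat is a placeholder for the sample-based sequence of the paper."""
    n = len(data)
    fn = kde(x, data, h)
    K2 = np.sqrt(0.6)                               # ||K||_2 = sqrt(3/5) for the Epanechnikov kernel
    on_D = fn >= 1.0 / a_n                          # estimated region D_hat_{a_n}
    half_width = (q / A_hat + A_hat) * K2 * np.sqrt(fn) / np.sqrt(n * h)
    return fn, half_width, on_D

rng = np.random.default_rng(0)
data = rng.normal(size=2000)
n = len(data)
h = n ** (-1.0 / 3.0)                               # illustrative bandwidth choice
a_n = n ** 0.1                                      # a_n -> infinity as a power of n
q = -np.log(-np.log(0.95))                          # 95% Gumbel quantile
x = np.linspace(-4.0, 4.0, 401)
fn, hw, on_D = self_normalized_band(x, data, h, a_n, A_hat=2.5, q=q)
# The band f_n(t) +/- hw(t) is asserted only on the region on_D.
```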
Similar Articles
Kernel Density Estimators: Convergence in Distribution for Weighted Sup-norms
Let $f_n$ denote a kernel density estimator of a bounded continuous density $f$ on the real line. Let $\Psi(t)$ be a positive continuous function such that $\|\Psi f\|_\infty < \infty$. Under natural smoothness conditions, necessary and sufficient conditions for the sequence $\sqrt{\frac{n h_n}{2\log h_n^{-1}}}\,\sup_{t\in\mathbb{R}}\bigl|\Psi(t)\bigl(f_n(t)-Ef_n(t)\bigr)\bigr|$ (properly centered and normalized) to converge in distribution to the double exponential law are obtained...
Full Text
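For orientation, the weighted sup-norm quantity in this snippet can be computed directly in a simulation, where $Ef_n$ is available by numerical convolution of the kernel with the known density. The weight $\Psi$, the Epanechnikov kernel, and the bandwidth below are assumed choices for illustration, not those of the paper.

```python
import numpy as np

def epanechnikov(u):
    return 0.75 * (1.0 - u**2) * (np.abs(u) <= 1.0)

def kde(x, data, h):
    """Kernel density estimate f_n."""
    return epanechnikov((x[:, None] - data[None, :]) / h).mean(axis=1) / h

def mean_kde(x, h, f, grid):
    """E f_n(t) = (K_h * f)(t), by a Riemann sum over a fine grid (simulation only)."""
    k = epanechnikov((x[:, None] - grid[None, :]) / h) / h
    return (k * f(grid)[None, :]).sum(axis=1) * (grid[1] - grid[0])

rng = np.random.default_rng(1)
n = 5000
data = rng.normal(size=n)
h = n ** (-1.0 / 5.0)
x = np.linspace(-5.0, 5.0, 1001)                            # the sup over R is approximated on a grid
grid = np.linspace(-8.0, 8.0, 4001)
f = lambda t: np.exp(-0.5 * t**2) / np.sqrt(2.0 * np.pi)    # true density (standard normal)
psi = 1.0 / (1.0 + x**2)                                    # assumed weight with ||psi f||_inf < infinity
stat = np.sqrt(n * h / (2.0 * np.log(1.0 / h))) * np.max(np.abs(psi * (kde(x, data, h) - mean_kde(x, h, f, grid))))
print(stat)   # the quantity that, properly centered and normalized, has a Gumbel limit
```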
Rates of Strong Uniform Consistency for Multivariate Kernel Density Estimators (Vitesse de convergence uniforme presque sûre pour des estimateurs à noyaux de densités multivariées)
Let f_n denote the usual kernel density estimator in several dimensions. It is shown that if {a_n} is a regular band sequence, K is a bounded square integrable kernel of several variables satisfying some additional mild conditions ((K1) below), and if the data consist of an i.i.d. sample from a distribution possessing a bounded density f with respect to Lebesgue measure on R^d, then lim sup_{n→∞}...
Full Text
The Relative Improvement of Bias Reduction in Density Estimator Using Geometric Extrapolated Kernel
One of the nonparametric procedures used to estimate densities is the kernel method. In this paper, in order to reduce the bias of kernel density estimation, methods such as the usual kernel (UK), geometric extrapolation usual kernel (GEUK), a bias reduction kernel (BRK) and a geometric extrapolation bias reduction kernel (GEBRK) are introduced. Theoretical properties, including the selection of the smoothness parameter...
Full Text
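As a point of reference for the geometric-extrapolation idea, the sketch below implements the classical construction f_h^{4/3} · f_{2h}^{-1/3} in the spirit of Terrell and Scott (1980), which cancels the leading O(h^2) bias term of log f_h. The GEUK/GEBRK estimators of the cited paper may differ in detail, and the Gaussian kernel and bandwidth used here are assumptions made for illustration.

```python
import numpy as np

def kde(x, data, h):
    """Usual kernel (UK) estimate with a Gaussian kernel (strictly positive,
    which keeps the geometric extrapolation well defined)."""
    u = (x[:, None] - data[None, :]) / h
    return np.exp(-0.5 * u**2).mean(axis=1) / (h * np.sqrt(2.0 * np.pi))

def geometric_extrapolation(x, data, h):
    """Geometric extrapolation over the bandwidths h and 2h:
    f_h(x)**(4/3) * f_{2h}(x)**(-1/3) cancels the O(h^2) bias term of log f_h."""
    f1 = kde(x, data, h)
    f2 = kde(x, data, 2.0 * h)
    return f1 ** (4.0 / 3.0) * f2 ** (-1.0 / 3.0)

rng = np.random.default_rng(2)
data = rng.normal(size=1000)
x = np.linspace(-3.0, 3.0, 201)
f_uk = kde(x, data, h=0.3)                        # usual kernel estimate
f_ge = geometric_extrapolation(x, data, h=0.3)    # bias-reduced, geometric-extrapolated estimate
```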
Almost Sure Convergence of Kernel Bivariate Distribution Function Estimator under Negative Association
Let {X_n, n ≥ 1} be a strictly stationary sequence of negatively associated random variables with common distribution function F. In this paper, we consider the estimation of the two-dimensional distribution function of (X_1, X_{k+1}) for fixed $k \in \mathbb{N}$, based on kernel-type estimators. We establish asymptotic normality and moment properties. From these we derive the optimal bandwidth...
Full Text
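For orientation, a kernel-smoothed estimator of the joint distribution function of (X_1, X_{k+1}) from a stationary sequence can be sketched as follows. The integrated Epanechnikov kernel, the bandwidth, and the toy dependent sequence are assumptions made for illustration; the paper's exact estimator and its negative-association conditions are not reproduced here.

```python
import numpy as np

def smooth_indicator(u):
    """W(u) = integral of the Epanechnikov kernel up to u (a smoothed indicator)."""
    w = 0.75 * (u - u**3 / 3.0) + 0.5
    return np.where(u <= -1.0, 0.0, np.where(u >= 1.0, 1.0, w))

def bivariate_df_estimate(x, y, sample, k, h):
    """Kernel estimate of F_k(x, y) = P(X_1 <= x, X_{k+1} <= y),
    averaging over the observed pairs (X_i, X_{i+k})."""
    X, Y = sample[:-k], sample[k:]
    return np.mean(smooth_indicator((x - X) / h) * smooth_indicator((y - Y) / h))

rng = np.random.default_rng(3)
z = rng.normal(size=2001)
sample = 0.6 * z[1:] - 0.4 * z[:-1]      # a toy stationary sequence with negative lag-1 covariance
print(bivariate_df_estimate(0.0, 0.0, sample, k=1, h=0.3))
```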
Density Estimators for Truncated Dependent Data
In some long-term studies, a series of dependent and possibly truncated lifetime data may be observed. Suppose that the lifetimes have a common continuous distribution function F. A popular stochastic measure of the distance between the density function f of the lifetimes and its kernel estimate f_n is the integrated square error (ISE). In this paper, we derive a central limit theorem for the...
Full Text